Investigation of Alternative Measures for Mutual Information

Authors

Abstract

Mutual information I(X;Y) is a fundamental information-theoretic quantity that estimates how much information the random variable Y carries about X. One way to define mutual information is by comparing the joint distribution of X and Y with the product of their marginals through the Kullback-Leibler (KL) divergence: if the two distributions are close to each other, there is almost no information leakage, since the variables are nearly independent. In the discrete setting this has a nice interpretation in terms of how many bits one variable reveals about the other. In the continuous case, however, the same reasoning does not carry over directly, which motivates trying different metrics or divergences to quantify shared information. In this paper, we evaluate a family of such alternatives for the continuous case. We deploy methods to bound these alternative measures and evaluate their performance.
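For the discrete case mentioned in the abstract, the KL-divergence definition can be computed directly from a joint probability table. The following is a minimal illustrative sketch (not the paper's method); the function name and example distributions are my own:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) as KL( p(x,y) || p(x)p(y) ), in nats.

    `joint` is a 2-D array of joint probabilities p(x, y)
    summing to 1; rows index X and columns index Y.
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    indep = px * py                         # product of marginals
    mask = joint > 0                        # 0 * log 0 treated as 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / indep[mask])))

# Independent fair bits: the joint factorizes, so I(X;Y) = 0.
indep_joint = np.array([[0.25, 0.25],
                        [0.25, 0.25]])
print(mutual_information(indep_joint))  # 0.0

# Perfectly correlated fair bits: I(X;Y) = log 2 nats (1 bit).
corr_joint = np.array([[0.5, 0.0],
                       [0.0, 0.5]])
print(mutual_information(corr_joint))
```

The two extremes illustrate the abstract's point: when the joint distribution equals the product of marginals, the divergence (and hence the leakage) is zero, and it grows as the variables become more dependent.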


Similar articles

Sequence Alignment, Mutual Information, and Dissimilarity Measures for Constructing Phylogenies

BACKGROUND Existing sequence alignment algorithms use heuristic scoring schemes based on biological expertise, which cannot be used as objective distance metrics. As a result, one relies on crude measures, like the p- or log-det distances, or makes explicit, and often too simplistic, a priori assumptions about sequence evolution. Information theory provides an alternative, in the form of mutual ...


Mutual information measures applied to EEG signals for sleepiness characterization.

Excessive daytime sleepiness (EDS) is one of the main symptoms of several sleep-related disorders, with a great impact on patients' lives. While many studies have been carried out in order to assess daytime sleepiness, automatic EDS detection still remains an open problem. In this work, a novel approach to this issue based on non-linear dynamical analysis of the EEG signal was proposed. Multic...


Mutual Information Measures for Subclass Error-Correcting Output Codes Classification

Error-Correcting Output Codes (ECOCs) are a common way to model multi-class classification problems. According to this state-of-the-art technique, a multi-class problem is decomposed into several binary ones. Additionally, on the ECOC framework we can apply the subclasses technique (sub-ECOC), where by splitting the initial classes of the problem we aim at the creation of larger but easier t...


Dynamic Bayesian Information Measures

This paper introduces measures of information for Bayesian analysis when the support of data distribution is truncated progressively. The focus is on the lifetime distributions where the support is truncated at the current age t>=0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...


Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information

Information theory is a branch of mathematics. Information theory is used in genetic and bioinformatics analyses and can be used for many analyses related to the biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing and structural-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...



Journal

Journal title: IFAC-PapersOnLine

Year: 2022

ISSN: 2405-8963, 2405-8971

DOI: https://doi.org/10.1016/j.ifacol.2022.09.016